Study: New imaging tech uses radar and AI to see through bandages, monitor wounds

An illustration showing a hand holding a device scanning a wound above the dressings. The words "New wound monitoring device without removing bandages" are off to the side.

Enables medical professionals to observe skin without removing dressings; aims to reduce secondary infections and save time for caregivers

Release Date: April 24, 2025


BUFFALO, N.Y. — University at Buffalo researchers have created new medical imaging technology that uses radar and artificial intelligence to see through dressings to monitor wounds and other skin conditions.

The system has the potential to reduce the time-consuming process of opening and closing dressings, as well as decrease the odds of secondary infections that result from wounds being exposed.

A study describing the work has been accepted for publication in IEEE Internet of Things Journal and is available online.

“Wound monitoring is a strain on hospitals and other health care settings,” says the study’s corresponding author Wenyao Xu, PhD, professor of computer science and engineering at UB. “Our system has great potential to reduce secondary infections, cut costs associated with caring for those infections, and save nurses and other health care providers time so they can concentrate on other duties.”

Xu and colleagues call the system mmSkin, named for the millimeter (also called extremely high frequency, or EHF) radio waves that the technology is based upon. EHF waves are used by radar guns, airport security scanners and, more recently, 5G cell phone networks.

mmSkin works by using EHF waves to measure the moisture content of the wound, which is a critical indicator of overall wound health. EHF waves are well-suited for this work because they balance high-resolution imaging with the ability to penetrate gauze.
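The release does not detail how moisture maps to the radar signal, but the underlying physics is standard: at millimeter-wave frequencies, wetter tissue has a higher dielectric permittivity, which changes how strongly the skin reflects the incident wave. The sketch below illustrates that relationship with a textbook Fresnel reflection model and assumed permittivity values; it is not the paper's model.

```python
# Illustrative physics sketch only, not the mmSkin algorithm.
import numpy as np

def reflection_magnitude(eps_r: float) -> float:
    """Normal-incidence Fresnel reflection from air onto a medium
    with relative permittivity eps_r (lossless approximation)."""
    n = np.sqrt(eps_r)
    return abs((1 - n) / (1 + n))

# Assumed, illustrative permittivity values for drier vs. wetter tissue:
dry, moist = 5.0, 25.0
print(reflection_magnitude(dry) < reflection_magnitude(moist))  # -> True
```

The monotonic link between moisture content and reflected signal strength is what makes the returned EHF signal informative about wound hydration.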

To make the system work, however, the researchers needed to address two challenges. One is filtering out interference from the surrounding environment, such as walls and other instruments, which can distort the radio waves and, thereby, the images. The other is turning complex technical data into images that health care professionals can interpret.

To solve the first problem, researchers created a complex algorithm that focuses only on the signal from the wound area. For the latter, they developed and trained an artificial intelligence model that interprets EHF signals and converts them into accurate wound moisture data.

For initial tests, researchers built a prototype mmSkin and created 60 “wound phantoms” of different shapes, sizes and moisture levels. (The phantoms are made of agar powder and water, which is a common method employed to simulate human tissue.) The system was 99.45% accurate in detecting moisture levels, and was able to see through two layers of gauze — a thickness commonly used in wound care.

The team then tested mmSkin on human skin with fake wounds created using ultrasound gel. The system was roughly 95.5% accurate on simulated wounds covered by one to two layers of gauze.

Also impressive, Xu says, is that the system worked well across different skin tones, ages and genders, which suggests its reliability for a diverse range of patients.

The researchers are working with UB's Technology Transfer Office, which has recently filed a Patent Cooperation Treaty (PCT) patent application on the technology. The researchers plan to further refine the system, with the goal of reducing the wound scanning time from a minute or more to under 10 seconds and making the images clearer. They're also planning to test mmSkin on real, complex human wounds.

“Overall, we believe mmSkin is a reliable, non-invasive way to safely assess wounds without removing dressings,” says Xu.

Co-authors of the study include Xiaoyu Zhang and Wei Bo, both PhD candidates in Xu's lab; Jun Xia, PhD, professor of biomedical engineering at UB; Yanda Cheng and Chuqin Huang, both PhD candidates in Xia's lab; and Huijuan Zhang, PhD, a postdoctoral researcher at Harvard University.

Additional co-authors include Zhengxiong Li, PhD (University of Colorado Denver), Chenhan Xu, PhD (North Carolina State University), and Ye Zhan, PhD (Linde Inc., Tonawanda, New York).

Media Contact Information

Cory Nealon
Director of Media Relations
Engineering, Computer Science
Tel: 716-645-4614
cmnealon@buffalo.edu